-
As urban populations grow, cities are becoming more complex, driving the deployment of interconnected sensing systems to realize the vision of smart cities. These systems aim to improve safety, mobility, and quality of life through applications that integrate diverse sensors with real-time decision-making. Streetscape applications, which focus on challenges like pedestrian safety and adaptive traffic management, depend on managing distributed, heterogeneous sensor data, aligning information across time and space, and enabling real-time processing. These tasks are inherently complex and often difficult to scale. The Streetscape Application Services Stack (SASS) addresses these challenges with three core services: multimodal data synchronization, spatiotemporal data fusion, and distributed edge computing. By structuring these capabilities as composable abstractions with clear semantics, SASS allows developers to scale streetscape applications efficiently while minimizing the complexity of multimodal integration. We evaluated SASS in two real-world testbed environments: a controlled parking lot and an urban intersection in a major U.S. city. These testbeds allowed us to test SASS under diverse conditions, demonstrating its practical applicability. The Multimodal Data Synchronization service reduced temporal misalignment errors by 88%, achieving synchronization accuracy within 50 milliseconds. The Spatiotemporal Data Fusion service improved detection accuracy for pedestrians and vehicles by over 10%, leveraging multicamera integration. The Distributed Edge Computing service increased system throughput by more than an order of magnitude. Together, these results show how SASS provides the abstractions and performance needed to support real-time, scalable urban applications, bridging the gap between sensing infrastructure and actionable streetscape intelligence.
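To make the synchronization abstraction concrete, the sketch below pairs samples from two sensor streams by nearest timestamp within a fixed tolerance. The stream layout, function name, and the 50 ms window are illustrative assumptions; the abstract does not specify the actual SASS service interface.

```python
# A minimal sketch of nearest-timestamp alignment between two sensor streams,
# illustrating the kind of pairing a multimodal synchronization service performs.
# Data layout and the 50 ms tolerance are assumptions for illustration only.
from bisect import bisect_left
from typing import List, Optional, Tuple

def align_streams(
    a: List[Tuple[float, object]],   # (timestamp_sec, sample), sorted by time
    b: List[Tuple[float, object]],   # (timestamp_sec, sample), sorted by time
    tolerance_sec: float = 0.050,    # 50 ms alignment window
) -> List[Tuple[object, Optional[object]]]:
    """Pair each sample in `a` with the nearest-in-time sample in `b`,
    or with None when no sample in `b` falls within `tolerance_sec`."""
    b_times = [t for t, _ in b]
    pairs = []
    for t_a, sample_a in a:
        i = bisect_left(b_times, t_a)
        # Candidates: the neighbours on either side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(b)]
        best = min(candidates, key=lambda j: abs(b_times[j] - t_a), default=None)
        if best is not None and abs(b_times[best] - t_a) <= tolerance_sec:
            pairs.append((sample_a, b[best][1]))
        else:
            pairs.append((sample_a, None))   # no peer within the window
    return pairs

# Example: camera frames at ~30 fps paired with 10 Hz lidar scans.
camera = [(0.000, "frame0"), (0.033, "frame1"), (0.066, "frame2")]
lidar = [(0.010, "scan0"), (0.110, "scan1")]
print(align_streams(camera, lidar))
```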
-
Blind and low-vision (BLV) people rely on GPS-based systems for outdoor navigation. GPS's inaccuracy, however, causes them to veer off track, run into obstacles, and struggle to reach precise destinations. While prior work has made precise navigation possible indoors via hardware installations, enabling this outdoors remains a challenge. Interestingly, many outdoor environments are already instrumented with hardware such as street cameras. In this work, we explore the idea of repurposing existing street cameras for outdoor navigation. Our community-driven approach considers both technical and sociotechnical concerns through engagements with various stakeholders: BLV users, residents, business owners, and Community Board leadership. The resulting system, StreetNav, processes a camera's video feed using computer vision and gives BLV pedestrians real-time navigation assistance. Our evaluations show that StreetNav guides users more precisely than GPS, but its technical performance is sensitive to environmental occlusions and distance from the camera. We discuss future implications for deploying such systems at scale.
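As a rough illustration of the kind of guidance computation a street-camera navigation system performs, the sketch below converts a pedestrian's estimated ground-plane position and a straight crosswalk segment into a veering cue. The geometry helper, field names, and threshold are assumptions for illustration, not StreetNav's actual implementation.

```python
# A hypothetical veering-cue computation: report how far a pedestrian has
# drifted from a straight path segment and which way to correct. All names
# and thresholds here are illustrative assumptions.
import math
from typing import Tuple

def veering_cue(
    position: Tuple[float, float],    # pedestrian (x, y) in metres, ground plane
    path_start: Tuple[float, float],  # path segment start
    path_end: Tuple[float, float],    # path segment end
    tolerance_m: float = 0.5,         # acceptable lateral drift
) -> str:
    """Return a simple spoken-style cue based on signed lateral offset."""
    px, py = position
    ax, ay = path_start
    bx, by = path_end
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    # Signed perpendicular distance: positive = left of the direction of travel.
    offset = (dx * (py - ay) - dy * (px - ax)) / length
    if abs(offset) <= tolerance_m:
        return "on path"
    side = "left" if offset > 0 else "right"
    correction = "right" if side == "left" else "left"
    return f"{abs(offset):.1f} m off to the {side}; veer {correction}"

# Example: walking north along a crosswalk, drifted ~1 m to the left.
print(veering_cue((-1.0, 5.0), (0.0, 0.0), (0.0, 10.0)))
```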
-
Social distancing is an effective public health tool to reduce the spread of respiratory pandemics such as COVID-19. To analyze compliance with social distancing policies, we design two video-based pipelines for social distancing analysis, namely, Auto-SDA and B-SDA. Auto-SDA (Automated video-based Social Distancing Analyzer) is designed to measure social distancing using street-level cameras. To avoid the privacy concerns of using street-level cameras, we further develop B-SDA (Bird's eye view Social Distancing Analyzer), which uses bird's eye view cameras, thereby preserving pedestrians' privacy. We used the COSMOS testbed deployed in West Harlem, New York City, to evaluate both pipelines. In particular, Auto-SDA and B-SDA are applied to videos recorded by two COSMOS cameras deployed on the 2nd floor (street-level) and 12th floor (bird's eye view) of Columbia University's Mudd building, looking at the 120th St. and Amsterdam Ave. intersection in New York City. Videos are recorded before and during the peak of the pandemic, as well as after the vaccines became broadly available. The results reveal the impact of social distancing policies on pedestrians' social behavior. For example, the analysis shows that after the lockdown, less than 55% of the pedestrians failed to adhere to the social distancing policies, whereas this percentage increased to 65% after the vaccines' availability. Moreover, after the lockdown, 0-20% of the pedestrians were affiliated with a social group, compared to 10-45% once the vaccines became available. The results also show that the percentage of face-to-face failures decreased from 42.3% (pre-pandemic) to 20.7% (after the lockdown).
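The core measurement behind both pipelines can be illustrated with a short sketch: once pedestrians are detected and projected onto the ground plane, distancing compliance reduces to counting pairs closer than a threshold. The 6 ft (about 1.83 m) threshold and data layout below are illustrative assumptions; the detection, tracking, and social-group handling in Auto-SDA and B-SDA are not shown.

```python
# A minimal sketch of pairwise distancing checks on ground-plane positions.
# Threshold and data layout are assumptions for illustration only.
from itertools import combinations
from math import hypot
from typing import List, Tuple

def distancing_violations(
    positions: List[Tuple[float, float]],  # pedestrian (x, y) in metres
    threshold_m: float = 1.83,             # ~6 ft distancing rule
) -> List[Tuple[int, int, float]]:
    """Return (i, j, distance) for every pedestrian pair closer than the threshold."""
    violations = []
    for (i, (xi, yi)), (j, (xj, yj)) in combinations(enumerate(positions), 2):
        d = hypot(xi - xj, yi - yj)
        if d < threshold_m:
            violations.append((i, j, d))
    return violations

# Example frame with four pedestrians; two of them are about 1 m apart.
frame = [(0.0, 0.0), (0.9, 0.4), (5.0, 5.0), (8.0, 1.0)]
print(distancing_violations(frame))
```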
-
Crowded metropolises present unique challenges to the potential deployment of autonomous vehicles. The safety of pedestrians cannot be compromised, and personal privacy must be preserved. Smart city intersections will be at the core of Artificial Intelligence (AI)-powered, citizen-friendly traffic management systems for such metropolises. Hence, the main objective of this work is to develop an experimentation framework for designing applications in support of secure and efficient traffic intersections in urban areas. We integrated a camera and a programmable edge computing node, deployed within the COSMOS testbed in New York City, with an Eclipse sensiNact data platform provided by Kentyou. We use this pipeline to collect and analyze video streams in real time to support smart city applications. In this demo, we present a video analytics pipeline that analyzes the video stream from a COSMOS street-level camera to extract traffic- and crowd-related information and sends it to a dedicated dashboard for real-time visualization and further assessment. This is done without sending the raw video, in order to avoid violating pedestrians' privacy.
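A minimal sketch of the privacy-preserving reporting step is shown below: only aggregate counts derived from the video are forwarded, never the raw frames. The dashboard URL, JSON schema, and function name are hypothetical placeholders, not the actual Kentyou/sensiNact interface.

```python
# A hypothetical reporting step: POST a small JSON summary of one analysis
# window to a dashboard instead of streaming raw video. Endpoint and schema
# are placeholders, not the real platform API.
import json
import time
import urllib.request

def publish_summary(dashboard_url: str, pedestrians: int, vehicles: int) -> None:
    """Send aggregate counts for one analysis window to the dashboard."""
    payload = json.dumps({
        "timestamp": time.time(),
        "pedestrian_count": pedestrians,
        "vehicle_count": vehicles,
    }).encode("utf-8")
    request = urllib.request.Request(
        dashboard_url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        response.read()  # dashboard acknowledgement; contents unused here

# Example: counts produced by an upstream object detector for a 1-second window.
# publish_summary("https://example.org/dashboard/intersection", pedestrians=12, vehicles=7)
```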